Project 2: UNLOCK

Human-Computer Interactions II

0. Introduction

Ravioli Ravioli and PenGUI are HCI projects that center on animated touch-based and sensor-based interfaces for a mobile device, providing alternatives to 'slide to unlock'. They were made in collaboration by Sophia Le, Marela Carlos, and Daniel Tiu.
This page documents the iterative design process that led to our final products, which demonstrate two different techniques:
  • One based on gesture or multi-touch input
  • The other based on sensor input (e.g. for one-hand use).
Examples of sensors include the accelerometer, gyroscope, light sensor, and camera. Our products use animation to provide feedback that the system is recognizing the action being taken, and to show how the system unlocks the device once the action is completed. Accidental activation is avoided by design: both techniques require a deliberate, multi-step or sustained action before the device unlocks.

1. Design Ideas

The first iteration of our design process consisted of each group member sketching 10+ ideas. Here, our top 7 of the 30+ sketches are shown.

2. Ten Refined Sketches

Using the 7 chosen sketches as inspiration, each member drew 10 refined sketches. Keep scrolling to view them all!

3. Final Product

The main idea for Interface A was the selection of button items. We chose this idea because it was simple and intuitive from a user's perspective, and it gave us some freedom to be creative in the items we chose for our demo. Interface A was developed using JavaScript, and features a red unlock button that appears upon successful passcode entry. The user simply presses this red unlock button when they want to lock their phone again, which is confirmed by the green lock symbol.

For Interface B, we decided to move forward with the idea of warming up our penguin using the proximity sensor. The sensor measures the distance between the hand and the phone and unlocks the device when a specific distance is reached. We chose this for our implementation because we liked the idea of warming up a character and thought the proximity sensor could be an interesting fit for the basic task of unlocking your phone. This implementation is a prototype: the mockup was done in Figma, and the sensor functionality and animations were built in ProtoPie. To enhance the user experience, we added phone vibrations along with on-screen modals to give users feedback on their actions.

While working on Interface B, we learned a detail about the proximity sensor on iPhones: it is the sensor responsible for making the screen go black when you make a call, to prevent accidental button presses with your ear. Since this behavior cannot be disabled on the iPhone, we devised a technique for the prototype that keeps the black screen from showing up in the demo, where it could be confusing without the above context.

Below is a video summarizing the iterative design process for our interfaces A and B, followed by rough code sketches of both techniques.
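We haven't reproduced the full Interface A source here, but the following minimal JavaScript sketch illustrates the item-passcode logic described above. The item names, passcode sequence, and element IDs are hypothetical stand-ins for our actual demo assets.

```javascript
// Minimal sketch of Interface A's item-passcode logic. Item names,
// passcode sequence, and element IDs are hypothetical stand-ins.
const PASSCODE = ["apple", "star", "moon"]; // hypothetical 3-item passcode
let entered = [];

// Each selectable item button calls this with its item name.
function onItemPressed(item) {
  entered.push(item);
  if (entered.length < PASSCODE.length) return;
  if (entered.join(",") === PASSCODE.join(",")) {
    unlock();
  } else {
    entered = []; // wrong sequence: clear and let the user retry
  }
}

function unlock() {
  entered = [];
  // Reveal the red unlock button; pressing it locks the phone again.
  const unlockButton = document.getElementById("unlock-button");
  unlockButton.hidden = false;
  unlockButton.addEventListener("click", lock, { once: true });
}

function lock() {
  document.getElementById("unlock-button").hidden = true;
  // The green lock symbol confirms the phone is locked again.
  document.getElementById("lock-symbol").hidden = false;
}
```

In the real interface, each selectable item is an image button wired to onItemPressed.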
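Interface B exists only as a ProtoPie prototype, but the warm-up interaction translates to code roughly as follows. This is a minimal sketch assuming a browser that implements the W3C Proximity Sensor specification (support for ProximitySensor is scarce); the distance threshold, hold time, and unlockPenguin animation hook are all illustrative assumptions. Requiring the hand to stay close for a short hold time mirrors how the design guards against accidental activation.

```javascript
// Minimal sketch of Interface B's "warm up the penguin" logic, assuming a
// browser that implements the W3C Proximity Sensor spec. Our real prototype
// was built in ProtoPie, so everything here is illustrative.
const UNLOCK_DISTANCE_CM = 3; // hypothetical "close enough" threshold
const HOLD_TIME_MS = 1500;    // hypothetical hold time guarding against
                              // accidental activation

let warmStart = null; // when the hand first came within range

function unlockPenguin() {
  // Placeholder for the unlock animation and on-screen modal.
  console.log("Penguin warmed up: device unlocked");
}

try {
  const sensor = new ProximitySensor();
  sensor.addEventListener("reading", () => {
    const now = performance.now();
    if (sensor.distance != null && sensor.distance <= UNLOCK_DISTANCE_CM) {
      // Hand is close enough: start (or keep) warming up the penguin.
      warmStart = warmStart ?? now;
      if (now - warmStart >= HOLD_TIME_MS) {
        sensor.stop();
        navigator.vibrate?.(100); // haptic feedback, as in the prototype
        unlockPenguin();
      }
    } else {
      warmStart = null; // hand moved away: the penguin cools down again
    }
  });
  sensor.addEventListener("error", (e) => console.error(e.error));
  sensor.start();
} catch (err) {
  console.error("ProximitySensor is not available:", err);
}
```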

My contributions:
  • Sketched design ideas that were eventually included in both final implementations
  • Participated in voting for the top sketches after both the initial and refined sketching rounds
  • Collaborated with Daniel to write the HTML, CSS, and JavaScript code for Interface A
  • Found and exported images to be used for Interface A
  • Recorded voice clips to narrate video demo
  • Recorded demonstration of Interface A for video demo
  • Collaborated with Marela and Daniel to write and edit video demo script